65 research outputs found

    Stability and Complexity of Minimising Probabilistic Automata

    We consider the state-minimisation problem for weighted and probabilistic automata. We provide a numerically stable polynomial-time minimisation algorithm for weighted automata, with guaranteed bounds on the numerical error when run with floating-point arithmetic. Our algorithm can also be used for "lossy" minimisation with bounded error. We show an application in image compression. In the second part of the paper we study the complexity of the minimisation problem for probabilistic automata. We prove that the problem is NP-hard and in PSPACE, improving a recent EXPTIME result. Comment: This is the full version of an ICALP'14 paper.
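    As a hedged illustration of the kind of reduction such algorithms perform, the sketch below implements plain forward state reduction for a weighted automaton. It is not the paper's algorithm: the function name forward_reduce, the tolerance, and the classical Gram-Schmidt step are illustrative assumptions, and Gram-Schmidt is exactly the kind of step whose floating-point behaviour the paper's stability analysis concerns.

```python
# Hypothetical sketch, not the paper's algorithm: restrict a weighted
# automaton (alpha, {M_sigma}, eta) to an orthonormal basis of its
# forward space span{alpha * M_w : w a word}. The reduced automaton
# assigns the same weight to every word.
import numpy as np

def forward_reduce(alpha, transitions, eta, tol=1e-10):
    basis = []

    def add(v):
        # Gram-Schmidt step: keep v only if it adds a new direction.
        for b in basis:
            v = v - (v @ b) * b
        norm = np.linalg.norm(v)
        if norm > tol:
            basis.append(v / norm)

    add(np.asarray(alpha, dtype=float))
    i = 0
    while i < len(basis):            # close the span under every M_sigma
        for M in transitions.values():
            add(basis[i] @ M)
        i += 1

    B = np.vstack(basis)             # rows: orthonormal basis, shape (m, n)
    return (B @ alpha,                                   # reduced initial vector
            {s: B @ M @ B.T for s, M in transitions.items()},
            B @ eta)                                     # reduced final vector

# Two equivalent states collapse into one, and word weights are preserved:
alpha = np.array([0.5, 0.5])
Ma = np.array([[0.3, 0.3], [0.3, 0.3]])
eta = np.array([1.0, 1.0])
a_r, t_r, e_r = forward_reduce(alpha, {"a": Ma}, eta)
assert abs(alpha @ Ma @ Ma @ eta - a_r @ t_r["a"] @ t_r["a"] @ e_r) < 1e-12
```

    A symmetric pass over the backward space completes the reduction; the numerical question studied in the paper is how errors in rank decisions like the one above propagate when the arithmetic is inexact.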

    A mathematical framework for contact detection between quadric and superquadric surfaces

    The calculation of the minimum distance between surfaces plays an important role in computational mechanics, namely in the study of constrained multibody systems in which contact forces take part. In this paper, a general rigid contact detection methodology for non-conformal bodies described by ellipsoidal and superellipsoidal surfaces is presented. The mathematical framework relies on simple algebraic and differential geometry, vector calculus, and the C2-continuous implicit representations of the surfaces. The proposed methodology establishes a set of collinearity and orthogonality constraints between vectors defining the contacting surfaces that, together with loci constraints specific to the type of surface being used, formulate the contact problem. This set of non-linear equations is solved numerically with the Newton-Raphson method, with Jacobian matrices calculated analytically. The method outputs the coordinates of the pair of points with a common normal direction and, consequently, the minimum distance between the two surfaces. Unlike other contact detection methodologies, the proposed mathematical framework relies neither on polygon-based geometries nor on complex non-linear optimization formulations. Furthermore, the methodology extends to other surfaces that are (strictly) convex, interact in a non-conformal fashion, admit an implicit representation, and are at least C2 continuous. Two distinct methods for calculating the tangent and binormal vectors to the implicit surfaces are introduced: (i) a method based on the Householder reflection matrix, and (ii) a method based on a square-plate rotation mechanism. The first provides a basis of three orthogonal vectors, one of which is collinear with the surface normal. For the latter, it is shown that, by analogy with this mechanism, at least two vectors non-collinear with the normal can be determined. Complementarily, several mathematical and computational aspects of the rigid contact detection methodology are described. The proposed methodology is applied to several test cases involving contact between different (super)ellipsoidal pairs. Numerical results show that the implemented methodology is highly efficient and accurate for ellipsoids and superellipsoids. Fundação para a Ciência e a Tecnologia (FCT)
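    As a hedged sketch of the common-normal formulation described above (not the authors' implementation, which solves the square non-linear system with Newton-Raphson and analytic Jacobians), the snippet below poses the same on-surface and collinearity residuals for two ellipsoids and hands them to a generic least-squares solver; it also includes the Householder-reflection frame of the first tangent-vector method. All function names and tolerances are illustrative.

```python
# Minimal sketch of the common-normal contact formulation for two
# implicit C2 surfaces (here, ellipsoids). Assumptions: generic
# least-squares in place of the paper's Newton-Raphson, and
# illustrative names throughout.
import numpy as np
from scipy.optimize import least_squares

def ellipsoid(center, semi_axes):
    """Implicit ellipsoid F(p) = 0 with analytic gradient (normal direction)."""
    c, s = np.asarray(center, float), np.asarray(semi_axes, float)
    F = lambda p: np.sum(((p - c) / s) ** 2) - 1.0
    grad = lambda p: 2.0 * (p - c) / s ** 2
    return F, grad

def common_normal_points(surf1, surf2, p0, q0):
    """Points p, q (one per surface) whose connecting segment is collinear
    with both surface normals; returns (p, q, distance)."""
    F1, g1 = surf1
    F2, g2 = surf2

    def residuals(z):
        p, q = z[:3], z[3:]
        n1, n2 = g1(p), g2(q)
        return np.concatenate((
            [F1(p), F2(q)],        # p and q lie on their surfaces
            np.cross(n1, q - p),   # n1 collinear with the segment p -> q
            np.cross(n1, n2),      # the two normals are collinear
        ))

    sol = least_squares(residuals, np.concatenate((p0, q0)), xtol=1e-12)
    p, q = sol.x[:3], sol.x[3:]
    return p, q, np.linalg.norm(q - p)

def householder_frame(n):
    """Orthonormal frame from one Householder reflection; the first column
    is collinear with n, the other two columns are tangent vectors."""
    n = n / np.linalg.norm(n)
    sign = 1.0 if n[0] >= 0.0 else -1.0
    v = n + sign * np.array([1.0, 0.0, 0.0])
    return np.eye(3) - 2.0 * np.outer(v, v) / (v @ v)

A = ellipsoid([0.0, 0.0, 0.0], [2.0, 1.0, 1.0])
B = ellipsoid([5.0, 0.5, 0.0], [1.0, 1.0, 1.0])
p, q, dist = common_normal_points(A, B, np.array([2.0, 0.0, 0.0]),
                                        np.array([4.0, 0.5, 0.0]))
```

    The solver converges to the common-normal pair nearest the initial guess, so for strictly convex, non-conformal bodies a starting point on each surface along the centre line yields the minimum-distance pair.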

    High-dimensional maximum marginal likelihood item factor analysis by adaptive quadrature

    Although the Bock–Aitkin likelihood-based estimation method for factor analysis of dichotomous item response data has important advantages over classical analysis of item tetrachoric correlations, a serious limitation of the method is its reliance on fixed-point Gauss-Hermite (G-H) quadrature in the solution of the likelihood equations and likelihood-ratio tests. When the number of latent dimensions is large, computational considerations require that the number of quadrature points per dimension be few. But with large numbers of items, the dispersion of the likelihood, given the response pattern, becomes so small that the likelihood cannot be accurately evaluated with the sparse fixed points in the latent space. In this paper, we demonstrate that substantial improvement in accuracy can be obtained by adapting the quadrature points to the location and dispersion of the likelihood surfaces corresponding to each distinct pattern in the data. In particular, we show that adaptive G-H quadrature, combined with mean and covariance adjustments at each iteration of an EM algorithm, produces an accurate fast-converging solution with as few as two points per dimension. Evaluations of this method with simulated data show accurate recovery of the generating factor loadings for models of up to eight dimensions. Unlike an earlier application of adaptive Gibbs sampling to this problem by Meng and Schilling, the simulations also confirm the validity of the present method in calculating likelihood-ratio chi-square statistics for determining the number of factors required in the model. Finally, we apply the method to a sample of real data from a test of teacher qualifications. Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/43596/1/11336_2003_Article_1141.pd
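    To see in miniature why adapting the points helps, the following one-dimensional sketch compares fixed and adaptive G-H evaluation of a single response-pattern likelihood under a 2PL model. It is illustrative only: the function names are assumptions, and the mode/curvature (Laplace-style) adjustment below stands in for the per-pattern mean and covariance adjustments the paper applies within each EM iteration.

```python
# Illustrative 1-D sketch, not the authors' EM implementation.
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar

def pattern_likelihood(u, a, b, n_pts, adaptive=True):
    """Marginal likelihood of response pattern u under a 2PL model with
    P_j(t) = 1 / (1 + exp(-a_j (t - b_j))) and prior t ~ N(0, 1)."""
    x, w = hermgauss(n_pts)               # nodes/weights for weight exp(-x^2)

    def log_g(t):                         # log of the full integrand
        p = 1.0 / (1.0 + np.exp(-a * (t - b)))
        return (np.sum(u * np.log(p) + (1.0 - u) * np.log1p(-p))
                - 0.5 * t ** 2 - 0.5 * np.log(2.0 * np.pi))

    if adaptive:
        # Centre and scale the nodes at the integrand's mode and curvature.
        mu = minimize_scalar(lambda t: -log_g(t)).x
        h = 1e-4
        curv = -(log_g(mu + h) - 2.0 * log_g(mu) + log_g(mu - h)) / h ** 2
        sigma = 1.0 / np.sqrt(curv)
    else:
        mu, sigma = 0.0, 1.0              # fixed G-H centred on the prior

    t = mu + np.sqrt(2.0) * sigma * x     # substitution t = mu + sqrt(2)*sigma*x
    logs = x ** 2 + np.array([log_g(ti) for ti in t])
    return np.sqrt(2.0) * sigma * np.sum(w * np.exp(logs))

rng = np.random.default_rng(0)
a = np.ones(40)
b = rng.normal(size=40)
u = (rng.random(40) < 1.0 / (1.0 + np.exp(-(1.5 - b)))).astype(float)
ref = pattern_likelihood(u, a, b, n_pts=101)
# With 40 items the integrand is sharply peaked: two adaptive points track
# `ref` closely, while two fixed points are far less accurate.
two_adaptive = pattern_likelihood(u, a, b, n_pts=2, adaptive=True)
two_fixed = pattern_likelihood(u, a, b, n_pts=2, adaptive=False)
```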

    Pervasive gaps in Amazonian ecological research

    Biodiversity loss is one of the main challenges of our time, and attempts to address it require a clear understanding of how ecological communities respond to environmental change across time and space. While the increasing availability of global databases on ecological communities has advanced our knowledge of biodiversity sensitivity to environmental changes, vast areas of the tropics remain understudied. In the American tropics, Amazonia stands out as the world's most diverse rainforest and the primary source of Neotropical biodiversity, but it remains among the least known forests in America and is often underrepresented in biodiversity databases. To worsen this situation, human-induced modifications may eliminate pieces of the Amazon's biodiversity puzzle before we can use them to understand how ecological communities are responding. To increase generalization and applicability of biodiversity knowledge, it is thus crucial to reduce biases in ecological research, particularly in regions projected to face the most pronounced environmental changes. We integrate ecological community metadata of 7,694 sampling sites for multiple organism groups in a machine learning model framework to map the research probability across the Brazilian Amazonia, while identifying the region's vulnerability to environmental change. 15%–18% of the most neglected areas in ecological research are expected to experience severe climate or land use changes by 2050. This means that unless we take immediate action, we will not be able to establish their current status, much less monitor how it is changing and what is being lost.
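    The abstract does not specify the model, so the following sketch only illustrates the general shape of such a "research probability" map: a classifier trained on grid cells that contain sampling sites versus background cells, then scored over the whole grid. The covariates, column names, synthetic data, and the random-forest choice are all assumptions, not the paper's method.

```python
# Stylised illustration only; the real covariates, model, and site
# metadata from the paper are not reproduced here.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_cells = 10_000

# Hypothetical map cells with accessibility/environment covariates.
grid = pd.DataFrame({
    "dist_to_city_km":  rng.gamma(2.0, 30.0, n_cells),
    "dist_to_river_km": rng.gamma(2.0, 10.0, n_cells),
    "elevation_m":      rng.normal(100.0, 50.0, n_cells),
})

# Stand-in for the site metadata: cells closer to cities and rivers are
# more likely to contain one of the sampling sites.
logit = 1.0 - 0.03 * grid["dist_to_city_km"] - 0.02 * grid["dist_to_river_km"]
grid["sampled"] = rng.random(n_cells) < 1.0 / (1.0 + np.exp(-logit))

features = ["dist_to_city_km", "dist_to_river_km", "elevation_m"]
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(grid[features], grid["sampled"])

# One research-probability value per map cell; low values flag neglected
# areas to cross-reference with projected climate/land-use change.
grid["research_prob"] = model.predict_proba(grid[features])[:, 1]
```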